Will Artificial Intelligence Produce Synthetic Sociopaths? - AI Summary

#artificialintelligence

A Turing Test for Ethical Artificial Intelligence ("AI"). Initially called "The Imitation Game," one version of the test imagines a human evaluator communicating in writing with a human participant and a computer designed to emulate human language and responses. Encouraged by the power and possibility of natural-language programs and self-learning algorithms, technologists and ethicists have begun exploring whether machines might pass an Ethical Imitation Game: whether they can convincingly imitate the empathy necessary for ethical deliberation and decision-making. It is an axiom of systems analysis that the meaning of a system lies outside the system. Analogously, programs written to successfully simulate empathy will represent synthetic sociopaths, faking empathy to further the ends of their programmers. Those pursuing ethical AI might therefore want to ask themselves not what they seek, nor how they might attain it, but why they want programs to simulate empathy.


Will Artificial Intelligence Produce Synthetic Sociopaths?

#artificialintelligence

Someday soon, computers may convincingly mimic human empathy. Will this lead to ethical artificial intelligence, or to synthetic sociopathy? Technologists and ethicists have started to explore the form and meaning of a Turing Test for Ethical AI. Initially called "The Imitation Game," one version of the test imagines a human evaluator communicating in writing with a human participant and a computer designed to emulate human language and responses. In Turing's view, if the evaluator could not reliably distinguish the human from the computer, the computer might be said to think, or at least to possess intelligence.